This is based on the natural logarithm of the number of microstates (W or Ω) corresponding to a given macrostate, a quantity known as the thermodynamic probability.
2. Besides the Maxwell–Boltzmann distribution mentioned above, he also related the kinetic energy of the particles to their thermodynamic probability, the number of microstates corresponding to the current macrostate, and showed that the logarithm of this quantity is proportional to the entropy (see the first formula after this list).
3. If the probabilities in question are the thermodynamic probabilities p_i, the (reduced) Gibbs entropy σ can then be seen as simply the amount of Shannon information needed to define the detailed microscopic state of the system, given its macroscopic description (the second formula after this list spells this out).
4. The information entropy H can be calculated for any probability distribution (if the "message" is taken to be that the event i which had probability p_i occurred, out of the space of the events possible), while the thermodynamic entropy S refers to thermodynamic probabilities p_i specifically; a short numerical sketch follows the formulas below.
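
In symbols, the proportionality described in the first two items is Boltzmann's entropy formula, where k_B is the Boltzmann constant and W the thermodynamic probability:

    S = k_\mathrm{B} \ln W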
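
The correspondence in item 3 can be made explicit. Writing the Gibbs entropy in terms of the thermodynamic probabilities p_i and dividing out k_B gives the reduced form σ, which matches Shannon's H up to the choice of logarithm base (a factor of ln 2 when H is measured in bits):

    S = -k_\mathrm{B} \sum_i p_i \ln p_i, \qquad
    \sigma = \frac{S}{k_\mathrm{B}} = -\sum_i p_i \ln p_i = (\ln 2)\, H,
    \quad \text{where } H = -\sum_i p_i \log_2 p_i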
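
As a numerical sketch of item 4's first claim, that H is defined for any probability distribution whatsoever, the following self-contained Python snippet computes H in bits (the helper name shannon_entropy is illustrative, not taken from any particular library):

    import math

    def shannon_entropy(probs, base=2.0):
        # Shannon entropy H = -sum(p * log(p)) over a discrete distribution.
        # Zero-probability events contribute nothing, by the convention 0*log 0 = 0.
        if abs(sum(probs) - 1.0) > 1e-9:
            raise ValueError("probabilities must sum to 1")
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    # Any distribution qualifies: a biased coin ...
    print(shannon_entropy([0.9, 0.1]))                 # ~0.469 bits
    # ... or a uniform choice among four events, giving log2(4) = 2 bits.
    print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits

The thermodynamic entropy S, by contrast, attaches physical meaning to such a sum only when the p_i are the occupation probabilities of the system's microstates, with the factor k_B restoring units of J/K.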